DEVICE AND METHOD FOR THREE-DIMENSIONAL RECONSTRUCTION OF A SCENE BY IMAGE ANALYSIS
Patent abstract:
This device comprises an image-taking apparatus (4) for capturing images of the scene, an analysis device (12) for calculating a three-dimensional reconstruction of the scene from at least one image (10) of the scene taken by the image-taking apparatus (4), and a projection device (14) for projecting a first light pattern and a complementary second light pattern onto the examined scene, the first light pattern and the second light pattern being projected along distinct projection axes (A1; A2) forming a non-zero angle with each other, so as to be superimposed to form a uniform image of uniform intensity in a projection plane.
Publication number: FR3020871A1
Application number: FR1454153
Filing date: 2014-05-07
Publication date: 2015-11-13
Inventor: Yannick Caulier
Applicant: Areva NP SAS
IPC main classification:
Patent description:
[0001] The present invention relates to the field of the three-dimensional reconstruction of a scene from one or more images of the scene taken with the aid of an image-taking apparatus and analyzed by means of an analysis device. The three-dimensional reconstruction of a scene by image analysis finds applications in industry, in particular for determining the three-dimensional relief of a part or of a surface situated in an inaccessible place, such as, for example, in a nuclear reactor, especially in a steam generator of the nuclear reactor. It makes it possible in particular to inspect the surface condition of a weld bead made between two parts, or to measure the roughness of a surface. On an image taken by an image-taking apparatus, elements in a sharpness zone appear sharp while elements outside the sharpness zone appear blurred. The sharpness zone is a slice of space between a front sharp plane and a rear sharp plane, which are parallel and separated by a distance called the depth of field. The front sharp plane and the rear sharp plane are located on either side of a plane of maximum sharpness. It is possible to determine a three-dimensional reconstruction of a scene by analyzing optical blur on an image. [0002] In a three-dimensional reconstruction method called DFF ("Depth From Focus"), several images of a scene are taken with an image-taking apparatus, the maximum sharpness plane being shifted along the image-taking axis between the images, without moving the elements of the scene between the images. By then combining the sharp regions of the different captured images, it is possible to reconstruct the three-dimensional relief of the scene plane by plane. In another three-dimensional reconstruction method called DFD ("Depth From Defocus"), the relief of the scene is reconstructed by analyzing the level of optical blur in the images. On each image, the higher the level of blur in a region, the further that region is shifted in depth from the sharpness zone. [0003] By knowing the optical parameters of the image-taking apparatus, it is possible to determine the depth of the point of the scene associated with each pixel of the image. The level of optical blur on an image is measured, for example, by measuring the contrast on the image. A low contrast indicates a blurred region of the image while a high contrast indicates a sharp region of the image. [0004] It is possible to project onto the scene a textured light pattern which increases the contrast, in order to improve the accuracy of the optical blur analysis on the captured images. The light pattern comprises, for example, parallel lines, fringes or a checkerboard. Nevertheless, this does not give complete satisfaction in terms of accuracy, for example for the three-dimensional reconstruction of low-roughness surfaces. [0005] One of the aims of the present invention is to propose a device for three-dimensional reconstruction of a scene by analysis of images taken with the aid of an image-taking apparatus, which is easy to use and has satisfactory accuracy. For this purpose, the invention proposes a device for three-dimensional reconstruction of a scene by image analysis, comprising an image-taking apparatus for capturing images of the scene, an analysis device for calculating a three-dimensional reconstruction of the scene from at least one image of the scene taken by the image-taking apparatus, and a projection device for projecting a first light pattern and a complementary second light pattern onto the examined scene, the first light pattern and the second light pattern being projected along distinct projection axes forming a non-zero angle between them, so as to be superimposed, forming a uniform image of uniform intensity in a projection plane.
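As an illustration of the blur-analysis principle recalled in paragraphs [0002] to [0004], the sketch below computes a simple local contrast measure, a sliding-window standard deviation of the gray level, that can serve as a sharpness indicator: low values suggest defocused regions, high values sharp ones. It is only one common way of measuring contrast, not the specific measure used by the patent, and the function and parameter names are illustrative.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_contrast(image, window=9):
    """Local standard deviation of the gray level, a simple sharpness/blur indicator.

    image  : 2-D array of gray levels.
    window : side length, in pixels, of the square neighbourhood.
    Low values correspond to blurred (defocused) regions, high values to sharp ones.
    """
    img = np.asarray(image, dtype=float)
    mean = uniform_filter(img, size=window)
    mean_sq = uniform_filter(img * img, size=window)
    # Var = E[x^2] - E[x]^2, clipped to avoid tiny negative values from rounding.
    return np.sqrt(np.clip(mean_sq - mean * mean, 0.0, None))
```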
The three-dimensional reconstruction device may comprise the following optional features, taken alone or in combination:
- the first light pattern and the second light pattern each comprise light areas and dark areas forming a geometric pattern, in particular light fringes;
- the analysis device is programmed to calculate a three-dimensional reconstruction from a measurement of the variation of the homogeneity of the luminous intensity on each image of the scene;
- the analysis device is programmed to calculate, from at least one image of the scene, the position of each point of the scene relative to the projection plane and/or to calculate, from at least one captured image, a measurement of the depth of each point of the scene as a function of the variation of the homogeneity of the intensity between the different points of the scene;
- the reconstruction device is programmed to capture a series of images of the scene by moving the maximum sharpness plane of the image-taking apparatus relative to the scene between each shot, the analysis device being programmed to calculate a three-dimensional reconstruction of the scene plane by plane from the series of images.
The invention also relates to a method of three-dimensional reconstruction of a scene by image analysis, comprising the steps of:
- projecting a first light pattern and a complementary second light pattern onto the examined scene, the first light pattern and the second light pattern being projected along distinct projection axes forming a non-zero angle between them, so as to be superimposed, forming a uniform projected image of uniform intensity in a projection plane;
- capturing at least one image of the scene being examined; then
- calculating a three-dimensional reconstruction of the scene by analyzing the captured image.
The three-dimensional reconstruction method may comprise the following optional features, taken alone or in combination:
- the first pattern and the second pattern each comprise light areas and dark areas forming a geometric pattern, in particular light fringes;
- the analysis device is programmed to calculate a three-dimensional reconstruction from a measurement of the variation of the homogeneity of the luminous intensity on each image of the scene;
- the step of calculating a three-dimensional reconstruction comprises calculating, from at least one image of the scene, the position of each point of the scene relative to the projection plane and/or calculating, from at least one captured image, a measurement of the depth of each point of the scene as a function of the variation of the homogeneity of the intensity between the different points of the scene;
- the image-capturing step comprises capturing a series of images of the scene by moving the maximum sharpness plane of the image-taking apparatus relative to the scene between each shot, the step of calculating a three-dimensional reconstruction comprising calculating the three-dimensional reconstruction of the scene plane by plane from the series of images.
The invention and its advantages will be better understood on reading the description which follows, given solely by way of example and with reference to the appended drawings, in which:
- Figure 1 is a diagrammatic front view of a device for three-dimensional reconstruction of a scene by image analysis; and
- Figure 2 is a schematic representation of light patterns projected by the three-dimensional reconstruction device, before and after superposition.
The three-dimensional reconstruction device 2 of Figure 1 is adapted to reconstruct the three-dimensional relief of a scene by analysis of images of the scene. [0006] The three-dimensional reconstruction device 2 comprises a digital image-taking apparatus 4, for example a digital still camera or a digital video camera, for taking digital images of the scene. The image-taking apparatus 4 includes a lens 6 for focusing the light from the scene onto a matrix sensor 8. The sensor 8, illuminated by the scene, captures a matrix image of the scene. The matrix image is formed of a matrix of pixels, each pixel being associated with image parameters (a luminous intensity for each elementary color for a color image, or a gray level for a black-and-white image). Each pixel of the image corresponds to a point of the scene. [0007] The image-taking apparatus 4 has an image-taking axis X, corresponding to the optical axis of its lens 6. The image-taking apparatus 4 has a maximum sharpness plane Pmax contained in a sharpness zone, which is a slice of space delimited between a front sharp plane P1 and a rear sharp plane P2 located on either side of the maximum sharpness plane Pmax. The maximum sharpness plane Pmax, the front sharp plane P1 and the rear sharp plane P2 are perpendicular to the image-taking axis X and parallel to one another. The distance between the front sharp plane P1 and the rear sharp plane P2, taken along the image-taking axis X, is the depth of field. [0008] The image-taking apparatus 4 takes images in which the elements of the scene included in the sharpness zone are sharp and the elements of the scene outside the sharpness zone are blurred. The distance from the maximum sharpness plane Pmax to the image-taking apparatus 4 and the depth of field are a function of the parameters of the image-taking apparatus 4 (focal length, aperture of the diaphragm, etc.). The settings of the image-taking apparatus 4 are adjustable or fixed. Preferably, for the three-dimensional reconstruction, the parameters of the image-taking apparatus 4 are chosen or adjusted such that the depth of field is small, for example less than the depth of the scene to be reconstructed. For example, tests have shown that it is possible to reconstruct, with a depth resolution of 1/10 mm, a scene about 40 mm deep with a depth of field of the order of 20 mm. Larger depths of field can also be considered, in the case where the texture of the surface makes it possible to determine the maximum of the contrast curve. [0009] The three-dimensional reconstruction device 2 comprises an electronic analysis device 12 able to analyze the images captured by the image-taking apparatus 4 in order to perform a three-dimensional reconstruction of the scene. The analysis device 12 comprises a processor and a memory in which is stored a computer application containing software instructions executable by the processor to automatically calculate a three-dimensional reconstruction of the scene by image analysis.
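As noted above, the position of the maximum sharpness plane and the depth of field follow from the optical parameters of the image-taking apparatus. Purely as an illustrative aid, the sketch below uses the textbook thin-lens approximation, not a formula taken from the patent, and the numerical values in the example are made up; it estimates the near and far limits of the sharpness zone from the focus distance, the focal length, the f-number and an assumed circle of confusion.

```python
def depth_of_field(focus_dist_mm, focal_mm, f_number, coc_mm=0.03):
    """Near and far limits of the sharpness zone (thin-lens approximation).

    focus_dist_mm : distance to the maximum sharpness plane Pmax.
    focal_mm      : focal length of the lens.
    f_number      : aperture number N = f/D.
    coc_mm        : acceptable circle of confusion on the sensor.
    Returns (near_mm, far_mm, depth_of_field_mm); the far limit is infinite
    beyond the hyperfocal distance.
    """
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = focus_dist_mm * (hyperfocal - focal_mm) / (hyperfocal + focus_dist_mm - 2 * focal_mm)
    if focus_dist_mm >= hyperfocal:
        return near, float("inf"), float("inf")
    far = focus_dist_mm * (hyperfocal - focal_mm) / (hyperfocal - focus_dist_mm)
    return near, far, far - near

# Example with arbitrary values: a close-focus setting giving a shallow sharpness zone.
print(depth_of_field(focus_dist_mm=300, focal_mm=50, f_number=2.8, coc_mm=0.02))
```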
The three-dimensional reconstruction device 2 comprises a projection device 14 for projecting onto the scene a first light pattern 16 and a second light pattern 18 which are textured and complementary, projected so as to form sharp elementary images in a projection plane PP situated in the sharpness zone of the image-taking apparatus 4, and so as to form a uniform combined image in the projection plane PP. Preferably, as illustrated, the projection plane PP substantially coincides with the maximum sharpness plane Pmax. The projection device 14 is configured to project a first projection light beam 15 carrying the first light pattern 16 and a second projection light beam 17 carrying the second light pattern 18, respectively along a first projection axis A1 and a second projection axis A2 forming a non-zero angle between them. The first beam 15 and the second beam 17 intersect. At least one projection axis A1, A2 is distinct from the image-taking axis X and forms a non-zero angle with it. In the illustrated example, each projection axis A1, A2 is distinct from the image-taking axis X and forms a non-zero angle with it. The optical axis X here forms the bisector of the angle between the projection axes A1, A2. The projection axes A1, A2 are concurrent with the optical axis X. As illustrated in Figure 1, the projection device 14 comprises a first projector 20 for projecting the first light pattern 16 and a second projector 22 for projecting the second light pattern 18. As shown in Figure 2, each light pattern 16, 18 is textured and has bright areas and dark areas. The light patterns 16 and 18 are here each formed of light fringes, that is to say a plurality of parallel straight strips that are alternately dark and bright. The light patterns 16, 18 are here in black and white. The light patterns 16, 18 are complementary, so that by superimposing them in the projection plane PP, in which the light patterns are sharp, the result is a uniform combined image 24 of uniform light intensity. The complementary light patterns 16, 18 are superimposed so that a dark area of one light pattern is superimposed on a light area of the other light pattern. Owing to the non-zero angle between the projection axes A1, A2, as soon as a point of the scene is located outside the projection plane, it is situated, depending on its position and its depth: in a zone of space where the light patterns are no longer sharp and the light intensity is reduced; in a zone of high light intensity, corresponding to the intersection of two light areas of the light patterns 16, 18, where the light intensity is higher than in the projection plane; and/or in a zone of low light intensity, corresponding to the intersection of two dark areas of the two light patterns 16, 18, where the light intensity is lower than in the projection plane. The light intensity of a point of the scene located outside the projection plane PP therefore differs (lower or higher) from that of a point of the scene situated in the projection plane PP. The light intensity of each pixel of an image taken by the image-taking apparatus 4 is therefore indicative of the depth of the corresponding point of the scene. The homogeneity of the intensity on the image varies according to the depth of the scene. The further a region of the scene is from the projection plane, the lower the homogeneity of the intensity on the corresponding area of the image. The closer a region of the scene is to the projection plane, the higher the homogeneity of the intensity on the corresponding area of the image.
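To make the notion of complementary patterns concrete, the sketch below generates two binary fringe patterns that are exact negatives of each other; it is an illustration only, and the image size, fringe period and function names are arbitrary choices rather than values from the patent. Where the two projections register, as in the projection plane PP, their sum is perfectly uniform; a lateral shift, standing in for a point lying off the projection plane, produces bright-on-bright and dark-on-dark overlaps and the uniformity is lost.

```python
import numpy as np

def fringe_pattern(height, width, period=16):
    """Binary vertical fringes: strips of width period/2, alternately dark (0) and bright (1)."""
    stripes = (np.arange(width) // (period // 2)) % 2
    return np.tile(stripes.astype(float), (height, 1))

h, w = 480, 640
pattern_1 = fringe_pattern(h, w)        # first light pattern (16)
pattern_2 = 1.0 - pattern_1             # complementary second light pattern (18)

# In the projection plane the two projections register exactly: the combined
# illumination is uniform.
combined_in_plane = pattern_1 + pattern_2
assert np.allclose(combined_in_plane, 1.0)

# Away from the projection plane, the oblique projection axes shift one pattern
# relative to the other; dark-on-dark and bright-on-bright overlaps appear.
shift = 4  # pixels, standing in for a depth offset
combined_off_plane = pattern_1 + np.roll(pattern_2, shift, axis=1)
print(combined_off_plane.min(), combined_off_plane.max())  # 0.0 2.0
```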
The image analysis device 12 is programmed to traverse each image taken by the image-taking apparatus 4 so as to detect the variations of intensity homogeneity, in order to determine the position of the points of the scene relative to the projection plane and/or to measure the depth of the point of the scene corresponding to each pixel. This makes it possible to calculate a three-dimensional reconstruction of the scene. In one embodiment, the image-taking apparatus 4 provides black-and-white images. A black-and-white image associates with each pixel a gray level representative of the luminous intensity of that pixel. In this case, the image analysis device 12 is programmed to analyze the light intensity on such an image on the basis of the gray levels of the image. In one embodiment, the image-taking apparatus 4 provides color images. A color image comprises three intensity images, each associated with a respective color, each intensity image associating each pixel of the image with a light intensity in that color. The three colors are for example the three elementary colors (green, yellow, blue). In this case, the image analysis device 12 is programmed to analyze the light intensity on a captured image by analyzing the light intensity on each of the three intensity images and then combining the analyses, for example by summing, for each pixel, the luminous intensities of the three intensity images. For example, the image analysis device 12 provides a three-dimensional reconstruction in the form of a depth map of the scene, which associates a depth with each pixel of an image of the scene taken by the image-taking apparatus 4. The image analysis device 12 is programmed to calculate a three-dimensional reconstruction of the scene as a function of the depth associated with each point of the scene. In one embodiment, the three-dimensional reconstruction device 2 is programmed to take a series of images of the scene by jointly shifting the maximum sharpness plane Pmax and the projection plane PP between each shot, the image analysis device 12 being programmed to calculate the three-dimensional reconstruction of the scene plane by plane, for example by determining, for each image, the sharp elements located in the projection plane. In order to shift the maximum sharpness plane Pmax and the projection plane PP, the image-taking apparatus 4 is adjustable and/or movable along its image-taking axis X, and/or the projection device 14 is adjustable and/or movable along the image-taking axis X. The scene is here formed by a surface 22 of an object 24 that it is desired to examine. In operation, the object 24 is disposed in the field of view of the image-taking apparatus 4, the surface to be examined being directed towards the image-taking apparatus 4.
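As an assumed sketch of the kind of processing described above (the patent specifies neither a formula nor these function names), the intensity homogeneity can be quantified with the same sliding-window standard deviation used earlier for contrast: it is close to zero where the combined illumination is uniform, i.e. where the scene lies in the projection plane, and grows as the scene departs from it. A color image is first reduced to a single intensity image by summing its three channels, as described above.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def intensity_image(image):
    """Reduce a color image (H x W x 3) to a single intensity image by summing
    the three intensity images; a gray-level image is returned unchanged."""
    img = np.asarray(image, dtype=float)
    return img.sum(axis=2) if img.ndim == 3 else img

def inhomogeneity_map(image, window=15):
    """Local standard deviation of the intensity: close to 0 where the combined
    illumination is uniform (scene in the projection plane PP), larger the
    further the scene departs from that plane."""
    img = intensity_image(image)
    mean = uniform_filter(img, size=window)
    mean_sq = uniform_filter(img * img, size=window)
    return np.sqrt(np.clip(mean_sq - mean * mean, 0.0, None))
```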
[0010] The projection device 14 projects the light patterns 16, 18 onto the surface 22 to be examined. The light patterns 16, 18 are superimposed in the projection plane PP situated in the sharpness zone P1-P2 of the image-taking apparatus 4, forming an illumination of uniform luminous intensity in the projection plane PP. The image-taking apparatus 4 takes at least one image 10 of the scene and supplies the image to the analysis device 12, which reconstructs the scene in three dimensions from the or each image, for example by calculating a depth for each pixel of an image taken by the image-taking apparatus 4. Thanks to the invention, it is possible to perform a three-dimensional reconstruction of a scene with satisfactory accuracy. In particular, it is possible to reconstruct in three dimensions a surface having a low roughness. [0011] Indeed, depth measurement by optical blur analysis on an image is usually based on a contrast measurement, which is a relative measure. For the measurement of the roughness of a surface, the level of blur measured on an image of the surface thus depends on the roughness of the surface, and if a surface has a low roughness, the differences in blur level between regions of the image are small and the measurement accuracy is low. On the other hand, the projection of superimposed complementary light patterns makes it possible to implement a depth measurement based on the variation of the intensity homogeneity, which does not depend on the overall roughness of the surface and which allows a precise depth measurement, even for a low-roughness surface.
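As a last illustrative sketch of the plane-by-plane embodiment (an assumption about one possible implementation, not the patent's own algorithm, with illustrative names), given a series of images taken with the projection plane PP at known depths, each pixel can be assigned the depth of the plane for which its neighbourhood is the most homogeneous, i.e. where the two patterns cancel best.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def _local_std(img, window):
    """Sliding-window standard deviation (the same homogeneity measure as above)."""
    mean = uniform_filter(img, size=window)
    mean_sq = uniform_filter(img * img, size=window)
    return np.sqrt(np.clip(mean_sq - mean * mean, 0.0, None))

def depth_map_from_series(images, plane_depths_mm, window=15):
    """Plane-by-plane reconstruction from a series of gray-level images, each taken
    with the projection plane (and maximum sharpness plane) at a known depth.

    images          : sequence of 2-D arrays, one per position of the projection plane.
    plane_depths_mm : depth of the projection plane for each image, same length.
    Returns a depth map assigning to each pixel the depth of the plane whose image
    shows the most homogeneous (best-cancelled) illumination around that pixel.
    """
    stack = np.stack([_local_std(np.asarray(im, dtype=float), window) for im in images])
    best_plane = np.argmin(stack, axis=0)
    return np.asarray(plane_depths_mm, dtype=float)[best_plane]
```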
Claims:
Claims (10) [0001] 1. A device for three-dimensional reconstruction of a scene by image analysis, comprising an image-taking apparatus (4) for capturing images of the scene, an analysis device (12) for calculating a three-dimensional reconstruction of the scene from at least one image (10) of the scene taken by the image-taking apparatus (4), and a projection device (14) for projecting a first light pattern and a complementary second light pattern onto the examined scene, the first light pattern and the second light pattern being projected along distinct projection axes (A1; A2) forming a non-zero angle between them, so as to be superimposed, forming a uniform image of uniform intensity in a projection plane. [0002] 2. A three-dimensional reconstruction device according to claim 1, wherein the first light pattern and the second light pattern each comprise light areas and dark areas forming a geometric pattern, in particular light fringes. [0003] 3. A three-dimensional reconstruction device according to claim 1 or claim 2, wherein the analysis device is programmed to calculate a three-dimensional reconstruction from a measurement of the variation of the homogeneity of the light intensity on each image (10) of the scene. [0004] 4. A three-dimensional reconstruction device according to any one of the preceding claims, wherein the analysis device (12) is programmed to calculate, from at least one image (10) of the scene, the position of each point of the scene relative to the projection plane and/or to calculate, from at least one captured image, a measurement of the depth of each point of the scene as a function of the variation of the homogeneity of the intensity between the different points of the scene. [0005] 5. A three-dimensional reconstruction device according to any one of the preceding claims, programmed to capture a series of images (10) of the scene by moving the maximum sharpness plane of the image-taking apparatus (4) relative to the scene between each shot, the analysis device (12) being programmed to calculate a three-dimensional reconstruction of the scene plane by plane from the series of images (10). [0006] 6. A method for three-dimensional reconstruction of a scene by image analysis, comprising the steps of: - projecting a first light pattern and a complementary second light pattern onto the examined scene, the first light pattern and the second light pattern being projected along distinct projection axes (A1; A2) forming a non-zero angle between them, so as to be superimposed, forming a uniform projected image of uniform intensity in a projection plane; - capturing at least one image of the scene being examined; then - calculating a three-dimensional reconstruction of the scene by analyzing the captured image. [0007] 7. A method of three-dimensional reconstruction according to claim 6, wherein the first pattern and the second pattern each comprise light areas and dark areas forming a geometric pattern, in particular light fringes. [0008] 8. A three-dimensional reconstruction method according to claim 6 or claim 7, wherein the analysis device is programmed to calculate a three-dimensional reconstruction from a measurement of the variation of the homogeneity of the light intensity on each image (10) of the scene. [0009] 9.
A method of three-dimensional reconstruction according to any one of claims 6 to 8, wherein the step of calculating a three-dimensional reconstruction comprises calculating, from at least one image (10) of the scene, the position of each point of the scene relative to the projection plane and/or calculating, from at least one captured image, a measurement of the depth of each point of the scene as a function of the variation of the homogeneity of the intensity between the different points of the scene. [0010] 10. A three-dimensional reconstruction method according to any one of claims 6 to 9, wherein the image-capture step comprises capturing a series of images (10) of the scene by moving the maximum sharpness plane of the image-taking apparatus (4) relative to the scene between each shot, the step of calculating a three-dimensional reconstruction comprising the calculation of the three-dimensional reconstruction of the scene plane by plane from the series of images (10).
Patent family:
- US10113867B2 (published 2018-10-30)
- CN106255863A (published 2016-12-21)
- FR3020871B1 (published 2017-08-11)
- US20170076458A1 (published 2017-03-16)
- CN106255863B (published 2019-09-13)
- ES2686195T3 (published 2018-10-16)
- EP3140611A1 (published 2017-03-15)
- WO2015169873A1 (published 2015-11-12)
- EP3140611B1 (published 2018-06-06)
- KR20170002409A (published 2017-01-06)